[ Wed Sep 28 02:16:33 2022 ] using warm up, epoch: 5
[ Wed Sep 28 02:16:48 2022 ] Parameters:
{'work_dir': 'work_dir/ntu60/csub/fc_vel',
 'model_saved_name': 'work_dir/ntu60/csub/fc_vel/runs',
 'config': 'config/nturgbd-cross-subject/fc_vel.yaml',
 'phase': 'train',
 'save_score': False,
 'joint_label': [],
 'seed': 1,
 'log_interval': 100,
 'save_interval': 1,
 'save_epoch': 35,
 'eval_interval': 5,
 'ema': False,
 'print_log': True,
 'show_topk': [1, 5],
 'feeder': 'feeders.feeder_ntu.Feeder',
 'num_worker': 48,
 'train_feeder_args': {'data_path': 'data/ntu60/NTU60_CS.npz', 'split': 'train', 'debug': False, 'random_choose': False, 'random_shift': False, 'random_move': False, 'window_size': 64, 'normalization': False, 'random_rot': True, 'p_interval': [0.5, 1], 'vel': True, 'bone': False},
 'test_feeder_args': {'data_path': 'data/ntu60/NTU60_CS.npz', 'split': 'test', 'window_size': 64, 'p_interval': [0.95], 'vel': True, 'bone': False, 'debug': False},
 'model': 'model.FC-Chains_L_multi_head_new_12_layers.Model',
 'model_args': {'num_class': 60, 'num_point': 25, 'num_person': 2},
 'weights': None,
 'ignore_weights': [],
 'base_lr': 0.1,
 'step': [90, 100],
 'device': [2],
 'optimizer': 'SGD',
 'nesterov': True,
 'momentum': 0.9,
 'batch_size': 64,
 'test_batch_size': 64,
 'start_epoch': 0,
 'num_epoch': 110,
 'weight_decay': 0.0004,
 'lr_decay_rate': 0.1,
 'warm_up_epoch': 5}

[ Wed Sep 28 02:16:48 2022 ] # Parameters: 2082097
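The header above logs warm_up_epoch=5, base_lr=0.1, step=[90, 100], and lr_decay_rate=0.1. A minimal sketch of the SGD learning-rate schedule those values imply, assuming the linear warm-up plus milestone step decay commonly used in this family of trainers (the trainer code itself is not part of this log, so the exact warm-up shape is an assumption):

```python
def lr_at_epoch(epoch, base_lr=0.1, warm_up_epoch=5,
                steps=(90, 100), lr_decay_rate=0.1):
    """Learning rate used during `epoch` (1-indexed), per the logged params.

    Assumed form: linear ramp to base_lr over the warm-up epochs,
    then multiply by lr_decay_rate after each milestone in `steps`.
    """
    if epoch <= warm_up_epoch:
        # Linear warm-up: base_lr * epoch / warm_up_epoch (assumed shape).
        return base_lr * epoch / warm_up_epoch
    # Count how many decay milestones have been passed.
    decays = sum(1 for s in steps if epoch > s)
    return base_lr * (lr_decay_rate ** decays)
```

Under these assumptions the run would train at 0.1 from epoch 5 through 90, at 0.01 from 91 through 100, and at 0.001 until epoch 110 (num_epoch).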
[ Wed Sep 28 02:16:48 2022 ] Training epoch: 1
[ Wed Sep 28 02:19:56 2022 ] 	Mean training loss: 2.5657. loss2: 0.0000. Mean training acc: 32.58%.
[ Wed Sep 28 02:19:56 2022 ] 	Time consumption: [Data]01%, [Network]98%
[ Wed Sep 28 02:19:56 2022 ] Eval epoch: 1
[ Wed Sep 28 02:20:26 2022 ] 	Mean test loss of 258 batches: 1.6879136897334757.
[ Wed Sep 28 02:20:26 2022 ] 	Top1: 51.93%
[ Wed Sep 28 02:20:26 2022 ] 	Top5: 84.41%
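Each eval block in this log follows the fixed format above (a timestamp, then tab-indented "Top1:"/"Top5:" lines). A small sketch for pulling the Top-1 curve out of such a log; the regex is inferred from the lines in this file only, not from any official log-parsing tool:

```python
import re

# Matches the "Top1: 51.93%" metric lines as they appear in this log.
TOP1_RE = re.compile(r"Top1:\s*([\d.]+)%")

def parse_top1(lines):
    """Yield Top-1 accuracies (floats, in percent) in eval order."""
    for line in lines:
        m = TOP1_RE.search(line)
        if m:
            yield float(m.group(1))
```

Applied to the full file (e.g. `parse_top1(open("log.txt"))`), this yields one value per eval epoch, which is convenient for plotting the accuracy curve or locating the best checkpoint.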
[ Wed Sep 28 02:20:26 2022 ] Training epoch: 2
[ Wed Sep 28 02:23:33 2022 ] 	Mean training loss: 1.5778. loss2: 0.0000. Mean training acc: 53.52%.
[ Wed Sep 28 02:23:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:23:33 2022 ] Eval epoch: 2
[ Wed Sep 28 02:24:02 2022 ] 	Mean test loss of 258 batches: 1.534830790157466.
[ Wed Sep 28 02:24:02 2022 ] 	Top1: 55.36%
[ Wed Sep 28 02:24:02 2022 ] 	Top5: 88.43%
[ Wed Sep 28 02:24:02 2022 ] Training epoch: 3
[ Wed Sep 28 02:27:09 2022 ] 	Mean training loss: 1.2433. loss2: 0.0000. Mean training acc: 62.54%.
[ Wed Sep 28 02:27:09 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:27:09 2022 ] Eval epoch: 3
[ Wed Sep 28 02:27:38 2022 ] 	Mean test loss of 258 batches: 1.2114770678124687.
[ Wed Sep 28 02:27:38 2022 ] 	Top1: 64.35%
[ Wed Sep 28 02:27:38 2022 ] 	Top5: 91.39%
[ Wed Sep 28 02:27:38 2022 ] Training epoch: 4
[ Wed Sep 28 02:30:45 2022 ] 	Mean training loss: 1.0836. loss2: 0.0000. Mean training acc: 66.63%.
[ Wed Sep 28 02:30:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:30:45 2022 ] Eval epoch: 4
[ Wed Sep 28 02:31:14 2022 ] 	Mean test loss of 258 batches: 1.118888567353404.
[ Wed Sep 28 02:31:14 2022 ] 	Top1: 67.16%
[ Wed Sep 28 02:31:14 2022 ] 	Top5: 91.70%
[ Wed Sep 28 02:31:14 2022 ] Training epoch: 5
[ Wed Sep 28 02:34:21 2022 ] 	Mean training loss: 0.9958. loss2: 0.0000. Mean training acc: 69.15%.
[ Wed Sep 28 02:34:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:34:21 2022 ] Eval epoch: 5
[ Wed Sep 28 02:34:51 2022 ] 	Mean test loss of 258 batches: 1.3034607093463573.
[ Wed Sep 28 02:34:51 2022 ] 	Top1: 62.07%
[ Wed Sep 28 02:34:51 2022 ] 	Top5: 90.54%
[ Wed Sep 28 02:34:51 2022 ] Training epoch: 6
[ Wed Sep 28 02:37:58 2022 ] 	Mean training loss: 0.8860. loss2: 0.0000. Mean training acc: 72.52%.
[ Wed Sep 28 02:37:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:37:58 2022 ] Eval epoch: 6
[ Wed Sep 28 02:38:28 2022 ] 	Mean test loss of 258 batches: 1.0186506404664166.
[ Wed Sep 28 02:38:28 2022 ] 	Top1: 68.65%
[ Wed Sep 28 02:38:28 2022 ] 	Top5: 93.66%
[ Wed Sep 28 02:38:28 2022 ] Training epoch: 7
[ Wed Sep 28 02:41:35 2022 ] 	Mean training loss: 0.8247. loss2: 0.0000. Mean training acc: 74.15%.
[ Wed Sep 28 02:41:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:41:35 2022 ] Eval epoch: 7
[ Wed Sep 28 02:42:04 2022 ] 	Mean test loss of 258 batches: 0.9943440235400385.
[ Wed Sep 28 02:42:04 2022 ] 	Top1: 70.02%
[ Wed Sep 28 02:42:04 2022 ] 	Top5: 93.52%
[ Wed Sep 28 02:42:04 2022 ] Training epoch: 8
[ Wed Sep 28 02:45:12 2022 ] 	Mean training loss: 0.7891. loss2: 0.0000. Mean training acc: 75.46%.
[ Wed Sep 28 02:45:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:45:12 2022 ] Eval epoch: 8
[ Wed Sep 28 02:45:41 2022 ] 	Mean test loss of 258 batches: 0.8518507886071538.
[ Wed Sep 28 02:45:41 2022 ] 	Top1: 74.17%
[ Wed Sep 28 02:45:41 2022 ] 	Top5: 94.80%
[ Wed Sep 28 02:45:41 2022 ] Training epoch: 9
[ Wed Sep 28 02:48:48 2022 ] 	Mean training loss: 0.7457. loss2: 0.0000. Mean training acc: 76.58%.
[ Wed Sep 28 02:48:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:48:48 2022 ] Eval epoch: 9
[ Wed Sep 28 02:49:17 2022 ] 	Mean test loss of 258 batches: 0.9235220918821734.
[ Wed Sep 28 02:49:17 2022 ] 	Top1: 73.88%
[ Wed Sep 28 02:49:18 2022 ] 	Top5: 94.09%
[ Wed Sep 28 02:49:18 2022 ] Training epoch: 10
[ Wed Sep 28 02:52:25 2022 ] 	Mean training loss: 0.7326. loss2: 0.0000. Mean training acc: 76.98%.
[ Wed Sep 28 02:52:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:52:25 2022 ] Eval epoch: 10
[ Wed Sep 28 02:52:54 2022 ] 	Mean test loss of 258 batches: 0.8691765460395073.
[ Wed Sep 28 02:52:54 2022 ] 	Top1: 73.55%
[ Wed Sep 28 02:52:54 2022 ] 	Top5: 94.80%
[ Wed Sep 28 02:52:54 2022 ] Training epoch: 11
[ Wed Sep 28 02:56:01 2022 ] 	Mean training loss: 0.7107. loss2: 0.0000. Mean training acc: 77.73%.
[ Wed Sep 28 02:56:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:56:01 2022 ] Eval epoch: 11
[ Wed Sep 28 02:56:30 2022 ] 	Mean test loss of 258 batches: 1.0012516895698946.
[ Wed Sep 28 02:56:30 2022 ] 	Top1: 70.64%
[ Wed Sep 28 02:56:30 2022 ] 	Top5: 93.12%
[ Wed Sep 28 02:56:30 2022 ] Training epoch: 12
[ Wed Sep 28 02:59:38 2022 ] 	Mean training loss: 0.7013. loss2: 0.0000. Mean training acc: 77.79%.
[ Wed Sep 28 02:59:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 02:59:38 2022 ] Eval epoch: 12
[ Wed Sep 28 03:00:07 2022 ] 	Mean test loss of 258 batches: 0.8920507446046948.
[ Wed Sep 28 03:00:07 2022 ] 	Top1: 73.28%
[ Wed Sep 28 03:00:07 2022 ] 	Top5: 94.67%
[ Wed Sep 28 03:00:07 2022 ] Training epoch: 13
[ Wed Sep 28 03:03:14 2022 ] 	Mean training loss: 0.6864. loss2: 0.0000. Mean training acc: 78.37%.
[ Wed Sep 28 03:03:14 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:03:14 2022 ] Eval epoch: 13
[ Wed Sep 28 03:03:43 2022 ] 	Mean test loss of 258 batches: 0.8563921399587808.
[ Wed Sep 28 03:03:43 2022 ] 	Top1: 74.03%
[ Wed Sep 28 03:03:44 2022 ] 	Top5: 94.72%
[ Wed Sep 28 03:03:44 2022 ] Training epoch: 14
[ Wed Sep 28 03:06:50 2022 ] 	Mean training loss: 0.6723. loss2: 0.0000. Mean training acc: 78.73%.
[ Wed Sep 28 03:06:50 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:06:50 2022 ] Eval epoch: 14
[ Wed Sep 28 03:07:20 2022 ] 	Mean test loss of 258 batches: 0.8917104180238044.
[ Wed Sep 28 03:07:20 2022 ] 	Top1: 73.72%
[ Wed Sep 28 03:07:20 2022 ] 	Top5: 95.49%
[ Wed Sep 28 03:07:20 2022 ] Training epoch: 15
[ Wed Sep 28 03:10:27 2022 ] 	Mean training loss: 0.6516. loss2: 0.0000. Mean training acc: 79.42%.
[ Wed Sep 28 03:10:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:10:27 2022 ] Eval epoch: 15
[ Wed Sep 28 03:10:56 2022 ] 	Mean test loss of 258 batches: 0.9328586832266446.
[ Wed Sep 28 03:10:56 2022 ] 	Top1: 72.43%
[ Wed Sep 28 03:10:57 2022 ] 	Top5: 93.95%
[ Wed Sep 28 03:10:57 2022 ] Training epoch: 16
[ Wed Sep 28 03:14:04 2022 ] 	Mean training loss: 0.6424. loss2: 0.0000. Mean training acc: 79.68%.
[ Wed Sep 28 03:14:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:14:04 2022 ] Eval epoch: 16
[ Wed Sep 28 03:14:33 2022 ] 	Mean test loss of 258 batches: 0.9667175971260367.
[ Wed Sep 28 03:14:33 2022 ] 	Top1: 71.04%
[ Wed Sep 28 03:14:33 2022 ] 	Top5: 95.22%
[ Wed Sep 28 03:14:33 2022 ] Training epoch: 17
[ Wed Sep 28 03:17:40 2022 ] 	Mean training loss: 0.6364. loss2: 0.0000. Mean training acc: 80.01%.
[ Wed Sep 28 03:17:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:17:40 2022 ] Eval epoch: 17
[ Wed Sep 28 03:18:09 2022 ] 	Mean test loss of 258 batches: 0.8275447399463765.
[ Wed Sep 28 03:18:10 2022 ] 	Top1: 75.40%
[ Wed Sep 28 03:18:10 2022 ] 	Top5: 95.69%
[ Wed Sep 28 03:18:10 2022 ] Training epoch: 18
[ Wed Sep 28 03:21:17 2022 ] 	Mean training loss: 0.6205. loss2: 0.0000. Mean training acc: 80.38%.
[ Wed Sep 28 03:21:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:21:17 2022 ] Eval epoch: 18
[ Wed Sep 28 03:21:46 2022 ] 	Mean test loss of 258 batches: 0.8565916663916536.
[ Wed Sep 28 03:21:46 2022 ] 	Top1: 73.69%
[ Wed Sep 28 03:21:46 2022 ] 	Top5: 95.08%
[ Wed Sep 28 03:21:46 2022 ] Training epoch: 19
[ Wed Sep 28 03:24:53 2022 ] 	Mean training loss: 0.6195. loss2: 0.0000. Mean training acc: 80.50%.
[ Wed Sep 28 03:24:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:24:53 2022 ] Eval epoch: 19
[ Wed Sep 28 03:25:23 2022 ] 	Mean test loss of 258 batches: 0.7396993290546329.
[ Wed Sep 28 03:25:23 2022 ] 	Top1: 77.87%
[ Wed Sep 28 03:25:23 2022 ] 	Top5: 96.02%
[ Wed Sep 28 03:25:23 2022 ] Training epoch: 20
[ Wed Sep 28 03:28:30 2022 ] 	Mean training loss: 0.6040. loss2: 0.0000. Mean training acc: 80.97%.
[ Wed Sep 28 03:28:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:28:30 2022 ] Eval epoch: 20
[ Wed Sep 28 03:29:00 2022 ] 	Mean test loss of 258 batches: 0.9512004511531933.
[ Wed Sep 28 03:29:00 2022 ] 	Top1: 72.04%
[ Wed Sep 28 03:29:00 2022 ] 	Top5: 93.49%
[ Wed Sep 28 03:29:00 2022 ] Training epoch: 21
[ Wed Sep 28 03:32:07 2022 ] 	Mean training loss: 0.6004. loss2: 0.0000. Mean training acc: 81.22%.
[ Wed Sep 28 03:32:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:32:07 2022 ] Eval epoch: 21
[ Wed Sep 28 03:32:36 2022 ] 	Mean test loss of 258 batches: 0.8146058937837911.
[ Wed Sep 28 03:32:36 2022 ] 	Top1: 76.45%
[ Wed Sep 28 03:32:36 2022 ] 	Top5: 94.77%
[ Wed Sep 28 03:32:36 2022 ] Training epoch: 22
[ Wed Sep 28 03:35:44 2022 ] 	Mean training loss: 0.5947. loss2: 0.0000. Mean training acc: 81.23%.
[ Wed Sep 28 03:35:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:35:44 2022 ] Eval epoch: 22
[ Wed Sep 28 03:36:13 2022 ] 	Mean test loss of 258 batches: 0.7667116908602012.
[ Wed Sep 28 03:36:13 2022 ] 	Top1: 78.03%
[ Wed Sep 28 03:36:13 2022 ] 	Top5: 95.38%
[ Wed Sep 28 03:36:13 2022 ] Training epoch: 23
[ Wed Sep 28 03:39:20 2022 ] 	Mean training loss: 0.5973. loss2: 0.0000. Mean training acc: 81.38%.
[ Wed Sep 28 03:39:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:39:20 2022 ] Eval epoch: 23
[ Wed Sep 28 03:39:50 2022 ] 	Mean test loss of 258 batches: 0.9224752123041671.
[ Wed Sep 28 03:39:50 2022 ] 	Top1: 73.62%
[ Wed Sep 28 03:39:50 2022 ] 	Top5: 94.78%
[ Wed Sep 28 03:39:50 2022 ] Training epoch: 24
[ Wed Sep 28 03:42:57 2022 ] 	Mean training loss: 0.5851. loss2: 0.0000. Mean training acc: 81.44%.
[ Wed Sep 28 03:42:57 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:42:57 2022 ] Eval epoch: 24
[ Wed Sep 28 03:43:27 2022 ] 	Mean test loss of 258 batches: 0.832323091667752.
[ Wed Sep 28 03:43:27 2022 ] 	Top1: 75.36%
[ Wed Sep 28 03:43:27 2022 ] 	Top5: 95.08%
[ Wed Sep 28 03:43:27 2022 ] Training epoch: 25
[ Wed Sep 28 03:46:34 2022 ] 	Mean training loss: 0.5744. loss2: 0.0000. Mean training acc: 81.97%.
[ Wed Sep 28 03:46:34 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:46:34 2022 ] Eval epoch: 25
[ Wed Sep 28 03:47:03 2022 ] 	Mean test loss of 258 batches: 0.8611124236454335.
[ Wed Sep 28 03:47:03 2022 ] 	Top1: 75.26%
[ Wed Sep 28 03:47:04 2022 ] 	Top5: 94.54%
[ Wed Sep 28 03:47:04 2022 ] Training epoch: 26
[ Wed Sep 28 03:50:11 2022 ] 	Mean training loss: 0.5876. loss2: 0.0000. Mean training acc: 81.49%.
[ Wed Sep 28 03:50:11 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:50:11 2022 ] Eval epoch: 26
[ Wed Sep 28 03:50:40 2022 ] 	Mean test loss of 258 batches: 0.7102522354486377.
[ Wed Sep 28 03:50:41 2022 ] 	Top1: 79.54%
[ Wed Sep 28 03:50:41 2022 ] 	Top5: 96.54%
[ Wed Sep 28 03:50:41 2022 ] Training epoch: 27
[ Wed Sep 28 03:53:48 2022 ] 	Mean training loss: 0.5809. loss2: 0.0000. Mean training acc: 81.61%.
[ Wed Sep 28 03:53:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:53:48 2022 ] Eval epoch: 27
[ Wed Sep 28 03:54:17 2022 ] 	Mean test loss of 258 batches: 0.7283710481584534.
[ Wed Sep 28 03:54:18 2022 ] 	Top1: 77.85%
[ Wed Sep 28 03:54:18 2022 ] 	Top5: 96.30%
[ Wed Sep 28 03:54:18 2022 ] Training epoch: 28
[ Wed Sep 28 03:57:25 2022 ] 	Mean training loss: 0.5751. loss2: 0.0000. Mean training acc: 81.79%.
[ Wed Sep 28 03:57:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 03:57:25 2022 ] Eval epoch: 28
[ Wed Sep 28 03:57:54 2022 ] 	Mean test loss of 258 batches: 0.9284202484197395.
[ Wed Sep 28 03:57:54 2022 ] 	Top1: 73.15%
[ Wed Sep 28 03:57:54 2022 ] 	Top5: 94.64%
[ Wed Sep 28 03:57:54 2022 ] Training epoch: 29
[ Wed Sep 28 04:01:01 2022 ] 	Mean training loss: 0.5722. loss2: 0.0000. Mean training acc: 81.87%.
[ Wed Sep 28 04:01:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:01:01 2022 ] Eval epoch: 29
[ Wed Sep 28 04:01:31 2022 ] 	Mean test loss of 258 batches: 1.1037997476352277.
[ Wed Sep 28 04:01:31 2022 ] 	Top1: 70.80%
[ Wed Sep 28 04:01:31 2022 ] 	Top5: 93.75%
[ Wed Sep 28 04:01:31 2022 ] Training epoch: 30
[ Wed Sep 28 04:04:38 2022 ] 	Mean training loss: 0.5639. loss2: 0.0000. Mean training acc: 82.31%.
[ Wed Sep 28 04:04:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:04:38 2022 ] Eval epoch: 30
[ Wed Sep 28 04:05:07 2022 ] 	Mean test loss of 258 batches: 0.8438056299621745.
[ Wed Sep 28 04:05:07 2022 ] 	Top1: 75.65%
[ Wed Sep 28 04:05:07 2022 ] 	Top5: 95.12%
[ Wed Sep 28 04:05:07 2022 ] Training epoch: 31
[ Wed Sep 28 04:08:15 2022 ] 	Mean training loss: 0.5694. loss2: 0.0000. Mean training acc: 82.05%.
[ Wed Sep 28 04:08:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:08:15 2022 ] Eval epoch: 31
[ Wed Sep 28 04:08:44 2022 ] 	Mean test loss of 258 batches: 0.840403208436892.
[ Wed Sep 28 04:08:44 2022 ] 	Top1: 74.92%
[ Wed Sep 28 04:08:44 2022 ] 	Top5: 94.87%
[ Wed Sep 28 04:08:44 2022 ] Training epoch: 32
[ Wed Sep 28 04:11:51 2022 ] 	Mean training loss: 0.5688. loss2: 0.0000. Mean training acc: 82.06%.
[ Wed Sep 28 04:11:51 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:11:51 2022 ] Eval epoch: 32
[ Wed Sep 28 04:12:21 2022 ] 	Mean test loss of 258 batches: 0.7514197855263718.
[ Wed Sep 28 04:12:21 2022 ] 	Top1: 78.30%
[ Wed Sep 28 04:12:21 2022 ] 	Top5: 95.91%
[ Wed Sep 28 04:12:21 2022 ] Training epoch: 33
[ Wed Sep 28 04:15:28 2022 ] 	Mean training loss: 0.5717. loss2: 0.0000. Mean training acc: 82.02%.
[ Wed Sep 28 04:15:28 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:15:28 2022 ] Eval epoch: 33
[ Wed Sep 28 04:15:57 2022 ] 	Mean test loss of 258 batches: 0.7989552764929542.
[ Wed Sep 28 04:15:57 2022 ] 	Top1: 76.49%
[ Wed Sep 28 04:15:58 2022 ] 	Top5: 95.28%
[ Wed Sep 28 04:15:58 2022 ] Training epoch: 34
[ Wed Sep 28 04:19:05 2022 ] 	Mean training loss: 0.5624. loss2: 0.0000. Mean training acc: 82.27%.
[ Wed Sep 28 04:19:05 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:19:05 2022 ] Eval epoch: 34
[ Wed Sep 28 04:19:34 2022 ] 	Mean test loss of 258 batches: 0.7489431466936141.
[ Wed Sep 28 04:19:34 2022 ] 	Top1: 78.10%
[ Wed Sep 28 04:19:34 2022 ] 	Top5: 96.12%
[ Wed Sep 28 04:19:34 2022 ] Training epoch: 35
[ Wed Sep 28 04:22:41 2022 ] 	Mean training loss: 0.5561. loss2: 0.0000. Mean training acc: 82.45%.
[ Wed Sep 28 04:22:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:22:41 2022 ] Eval epoch: 35
[ Wed Sep 28 04:23:10 2022 ] 	Mean test loss of 258 batches: 0.8188581731199294.
[ Wed Sep 28 04:23:11 2022 ] 	Top1: 75.88%
[ Wed Sep 28 04:23:11 2022 ] 	Top5: 94.73%
[ Wed Sep 28 04:23:11 2022 ] Training epoch: 36
[ Wed Sep 28 04:26:18 2022 ] 	Mean training loss: 0.5587. loss2: 0.0000. Mean training acc: 82.54%.
[ Wed Sep 28 04:26:18 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:26:18 2022 ] Eval epoch: 36
[ Wed Sep 28 04:26:47 2022 ] 	Mean test loss of 258 batches: 1.4309346729008727.
[ Wed Sep 28 04:26:47 2022 ] 	Top1: 61.52%
[ Wed Sep 28 04:26:47 2022 ] 	Top5: 89.93%
[ Wed Sep 28 04:26:47 2022 ] Training epoch: 37
[ Wed Sep 28 04:29:54 2022 ] 	Mean training loss: 0.5600. loss2: 0.0000. Mean training acc: 82.34%.
[ Wed Sep 28 04:29:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:29:55 2022 ] Eval epoch: 37
[ Wed Sep 28 04:30:24 2022 ] 	Mean test loss of 258 batches: 0.8021998124760251.
[ Wed Sep 28 04:30:24 2022 ] 	Top1: 75.58%
[ Wed Sep 28 04:30:24 2022 ] 	Top5: 95.59%
[ Wed Sep 28 04:30:24 2022 ] Training epoch: 38
[ Wed Sep 28 04:33:31 2022 ] 	Mean training loss: 0.5532. loss2: 0.0000. Mean training acc: 82.63%.
[ Wed Sep 28 04:33:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:33:31 2022 ] Eval epoch: 38
[ Wed Sep 28 04:34:00 2022 ] 	Mean test loss of 258 batches: 0.7503463874260584.
[ Wed Sep 28 04:34:00 2022 ] 	Top1: 77.45%
[ Wed Sep 28 04:34:00 2022 ] 	Top5: 95.57%
[ Wed Sep 28 04:34:00 2022 ] Training epoch: 39
[ Wed Sep 28 04:37:08 2022 ] 	Mean training loss: 0.5478. loss2: 0.0000. Mean training acc: 82.81%.
[ Wed Sep 28 04:37:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:37:08 2022 ] Eval epoch: 39
[ Wed Sep 28 04:37:37 2022 ] 	Mean test loss of 258 batches: 0.777317544517591.
[ Wed Sep 28 04:37:37 2022 ] 	Top1: 76.99%
[ Wed Sep 28 04:37:37 2022 ] 	Top5: 95.65%
[ Wed Sep 28 04:37:37 2022 ] Training epoch: 40
[ Wed Sep 28 04:40:45 2022 ] 	Mean training loss: 0.5514. loss2: 0.0000. Mean training acc: 82.61%.
[ Wed Sep 28 04:40:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:40:45 2022 ] Eval epoch: 40
[ Wed Sep 28 04:41:14 2022 ] 	Mean test loss of 258 batches: 0.7406759163899015.
[ Wed Sep 28 04:41:14 2022 ] 	Top1: 77.73%
[ Wed Sep 28 04:41:14 2022 ] 	Top5: 96.02%
[ Wed Sep 28 04:41:14 2022 ] Training epoch: 41
[ Wed Sep 28 04:44:22 2022 ] 	Mean training loss: 0.5473. loss2: 0.0000. Mean training acc: 82.63%.
[ Wed Sep 28 04:44:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:44:22 2022 ] Eval epoch: 41
[ Wed Sep 28 04:44:51 2022 ] 	Mean test loss of 258 batches: 0.727788760565048.
[ Wed Sep 28 04:44:51 2022 ] 	Top1: 78.39%
[ Wed Sep 28 04:44:52 2022 ] 	Top5: 95.52%
[ Wed Sep 28 04:44:52 2022 ] Training epoch: 42
[ Wed Sep 28 04:47:59 2022 ] 	Mean training loss: 0.5515. loss2: 0.0000. Mean training acc: 82.59%.
[ Wed Sep 28 04:47:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:47:59 2022 ] Eval epoch: 42
[ Wed Sep 28 04:48:28 2022 ] 	Mean test loss of 258 batches: 0.7089506700172905.
[ Wed Sep 28 04:48:28 2022 ] 	Top1: 78.82%
[ Wed Sep 28 04:48:28 2022 ] 	Top5: 96.20%
[ Wed Sep 28 04:48:28 2022 ] Training epoch: 43
[ Wed Sep 28 04:51:36 2022 ] 	Mean training loss: 0.5439. loss2: 0.0000. Mean training acc: 82.82%.
[ Wed Sep 28 04:51:36 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:51:36 2022 ] Eval epoch: 43
[ Wed Sep 28 04:52:05 2022 ] 	Mean test loss of 258 batches: 0.7870828078467716.
[ Wed Sep 28 04:52:05 2022 ] 	Top1: 76.66%
[ Wed Sep 28 04:52:05 2022 ] 	Top5: 95.78%
[ Wed Sep 28 04:52:05 2022 ] Training epoch: 44
[ Wed Sep 28 04:55:12 2022 ] 	Mean training loss: 0.5466. loss2: 0.0000. Mean training acc: 82.66%.
[ Wed Sep 28 04:55:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:55:13 2022 ] Eval epoch: 44
[ Wed Sep 28 04:55:42 2022 ] 	Mean test loss of 258 batches: 0.7098745285540589.
[ Wed Sep 28 04:55:42 2022 ] 	Top1: 79.02%
[ Wed Sep 28 04:55:42 2022 ] 	Top5: 96.07%
[ Wed Sep 28 04:55:42 2022 ] Training epoch: 45
[ Wed Sep 28 04:58:49 2022 ] 	Mean training loss: 0.5394. loss2: 0.0000. Mean training acc: 82.98%.
[ Wed Sep 28 04:58:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 04:58:49 2022 ] Eval epoch: 45
[ Wed Sep 28 04:59:19 2022 ] 	Mean test loss of 258 batches: 0.7555944759023282.
[ Wed Sep 28 04:59:19 2022 ] 	Top1: 77.19%
[ Wed Sep 28 04:59:19 2022 ] 	Top5: 95.77%
[ Wed Sep 28 04:59:19 2022 ] Training epoch: 46
[ Wed Sep 28 05:02:27 2022 ] 	Mean training loss: 0.5381. loss2: 0.0000. Mean training acc: 83.07%.
[ Wed Sep 28 05:02:27 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:02:27 2022 ] Eval epoch: 46
[ Wed Sep 28 05:02:56 2022 ] 	Mean test loss of 258 batches: 0.7795148848794228.
[ Wed Sep 28 05:02:56 2022 ] 	Top1: 76.39%
[ Wed Sep 28 05:02:56 2022 ] 	Top5: 95.72%
[ Wed Sep 28 05:02:56 2022 ] Training epoch: 47
[ Wed Sep 28 05:06:04 2022 ] 	Mean training loss: 0.5410. loss2: 0.0000. Mean training acc: 83.04%.
[ Wed Sep 28 05:06:04 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:06:04 2022 ] Eval epoch: 47
[ Wed Sep 28 05:06:33 2022 ] 	Mean test loss of 258 batches: 0.7231145393709804.
[ Wed Sep 28 05:06:33 2022 ] 	Top1: 78.57%
[ Wed Sep 28 05:06:33 2022 ] 	Top5: 95.94%
[ Wed Sep 28 05:06:33 2022 ] Training epoch: 48
[ Wed Sep 28 05:09:40 2022 ] 	Mean training loss: 0.5376. loss2: 0.0000. Mean training acc: 83.23%.
[ Wed Sep 28 05:09:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:09:40 2022 ] Eval epoch: 48
[ Wed Sep 28 05:10:10 2022 ] 	Mean test loss of 258 batches: 0.9624095875625462.
[ Wed Sep 28 05:10:10 2022 ] 	Top1: 72.70%
[ Wed Sep 28 05:10:10 2022 ] 	Top5: 92.56%
[ Wed Sep 28 05:10:10 2022 ] Training epoch: 49
[ Wed Sep 28 05:13:17 2022 ] 	Mean training loss: 0.5394. loss2: 0.0000. Mean training acc: 82.88%.
[ Wed Sep 28 05:13:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:13:17 2022 ] Eval epoch: 49
[ Wed Sep 28 05:13:47 2022 ] 	Mean test loss of 258 batches: 0.6744178657152856.
[ Wed Sep 28 05:13:47 2022 ] 	Top1: 80.21%
[ Wed Sep 28 05:13:47 2022 ] 	Top5: 96.15%
[ Wed Sep 28 05:13:47 2022 ] Training epoch: 50
[ Wed Sep 28 05:16:54 2022 ] 	Mean training loss: 0.5386. loss2: 0.0000. Mean training acc: 82.93%.
[ Wed Sep 28 05:16:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:16:54 2022 ] Eval epoch: 50
[ Wed Sep 28 05:17:23 2022 ] 	Mean test loss of 258 batches: 0.7994787489721017.
[ Wed Sep 28 05:17:23 2022 ] 	Top1: 75.95%
[ Wed Sep 28 05:17:23 2022 ] 	Top5: 95.66%
[ Wed Sep 28 05:17:23 2022 ] Training epoch: 51
[ Wed Sep 28 05:20:31 2022 ] 	Mean training loss: 0.5438. loss2: 0.0000. Mean training acc: 82.86%.
[ Wed Sep 28 05:20:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:20:31 2022 ] Eval epoch: 51
[ Wed Sep 28 05:21:00 2022 ] 	Mean test loss of 258 batches: 0.8840072161929552.
[ Wed Sep 28 05:21:00 2022 ] 	Top1: 75.67%
[ Wed Sep 28 05:21:00 2022 ] 	Top5: 94.07%
[ Wed Sep 28 05:21:00 2022 ] Training epoch: 52
[ Wed Sep 28 05:24:08 2022 ] 	Mean training loss: 0.5378. loss2: 0.0000. Mean training acc: 83.09%.
[ Wed Sep 28 05:24:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:24:08 2022 ] Eval epoch: 52
[ Wed Sep 28 05:24:37 2022 ] 	Mean test loss of 258 batches: 0.7495801840410676.
[ Wed Sep 28 05:24:37 2022 ] 	Top1: 78.31%
[ Wed Sep 28 05:24:37 2022 ] 	Top5: 96.12%
[ Wed Sep 28 05:24:37 2022 ] Training epoch: 53
[ Wed Sep 28 05:27:44 2022 ] 	Mean training loss: 0.5335. loss2: 0.0000. Mean training acc: 83.15%.
[ Wed Sep 28 05:27:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:27:44 2022 ] Eval epoch: 53
[ Wed Sep 28 05:28:14 2022 ] 	Mean test loss of 258 batches: 0.7927947502958682.
[ Wed Sep 28 05:28:14 2022 ] 	Top1: 76.98%
[ Wed Sep 28 05:28:14 2022 ] 	Top5: 95.16%
[ Wed Sep 28 05:28:14 2022 ] Training epoch: 54
[ Wed Sep 28 05:31:21 2022 ] 	Mean training loss: 0.5415. loss2: 0.0000. Mean training acc: 82.97%.
[ Wed Sep 28 05:31:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:31:22 2022 ] Eval epoch: 54
[ Wed Sep 28 05:31:51 2022 ] 	Mean test loss of 258 batches: 0.7502551539692768.
[ Wed Sep 28 05:31:51 2022 ] 	Top1: 77.24%
[ Wed Sep 28 05:31:51 2022 ] 	Top5: 95.71%
[ Wed Sep 28 05:31:51 2022 ] Training epoch: 55
[ Wed Sep 28 05:34:58 2022 ] 	Mean training loss: 0.5333. loss2: 0.0000. Mean training acc: 83.03%.
[ Wed Sep 28 05:34:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:34:58 2022 ] Eval epoch: 55
[ Wed Sep 28 05:35:28 2022 ] 	Mean test loss of 258 batches: 0.728008203266203.
[ Wed Sep 28 05:35:28 2022 ] 	Top1: 77.81%
[ Wed Sep 28 05:35:28 2022 ] 	Top5: 96.22%
[ Wed Sep 28 05:35:28 2022 ] Training epoch: 56
[ Wed Sep 28 05:38:35 2022 ] 	Mean training loss: 0.5396. loss2: 0.0000. Mean training acc: 83.09%.
[ Wed Sep 28 05:38:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:38:35 2022 ] Eval epoch: 56
[ Wed Sep 28 05:39:04 2022 ] 	Mean test loss of 258 batches: 0.8753799352534982.
[ Wed Sep 28 05:39:05 2022 ] 	Top1: 74.51%
[ Wed Sep 28 05:39:05 2022 ] 	Top5: 93.78%
[ Wed Sep 28 05:39:05 2022 ] Training epoch: 57
[ Wed Sep 28 05:42:12 2022 ] 	Mean training loss: 0.5332. loss2: 0.0000. Mean training acc: 83.37%.
[ Wed Sep 28 05:42:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:42:12 2022 ] Eval epoch: 57
[ Wed Sep 28 05:42:42 2022 ] 	Mean test loss of 258 batches: 0.6782100995381674.
[ Wed Sep 28 05:42:42 2022 ] 	Top1: 79.21%
[ Wed Sep 28 05:42:42 2022 ] 	Top5: 96.38%
[ Wed Sep 28 05:42:42 2022 ] Training epoch: 58
[ Wed Sep 28 05:45:49 2022 ] 	Mean training loss: 0.5315. loss2: 0.0000. Mean training acc: 83.29%.
[ Wed Sep 28 05:45:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:45:49 2022 ] Eval epoch: 58
[ Wed Sep 28 05:46:19 2022 ] 	Mean test loss of 258 batches: 0.8039896609478219.
[ Wed Sep 28 05:46:19 2022 ] 	Top1: 75.90%
[ Wed Sep 28 05:46:19 2022 ] 	Top5: 95.68%
[ Wed Sep 28 05:46:19 2022 ] Training epoch: 59
[ Wed Sep 28 05:49:26 2022 ] 	Mean training loss: 0.5349. loss2: 0.0000. Mean training acc: 83.06%.
[ Wed Sep 28 05:49:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:49:26 2022 ] Eval epoch: 59
[ Wed Sep 28 05:49:56 2022 ] 	Mean test loss of 258 batches: 0.6958586783256642.
[ Wed Sep 28 05:49:56 2022 ] 	Top1: 79.04%
[ Wed Sep 28 05:49:56 2022 ] 	Top5: 96.37%
[ Wed Sep 28 05:49:56 2022 ] Training epoch: 60
[ Wed Sep 28 05:53:03 2022 ] 	Mean training loss: 0.5333. loss2: 0.0000. Mean training acc: 83.21%.
[ Wed Sep 28 05:53:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:53:04 2022 ] Eval epoch: 60
[ Wed Sep 28 05:53:33 2022 ] 	Mean test loss of 258 batches: 0.7545064206040183.
[ Wed Sep 28 05:53:33 2022 ] 	Top1: 77.49%
[ Wed Sep 28 05:53:33 2022 ] 	Top5: 95.80%
[ Wed Sep 28 05:53:33 2022 ] Training epoch: 61
[ Wed Sep 28 05:56:41 2022 ] 	Mean training loss: 0.5297. loss2: 0.0000. Mean training acc: 83.25%.
[ Wed Sep 28 05:56:41 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 05:56:41 2022 ] Eval epoch: 61
[ Wed Sep 28 05:57:10 2022 ] 	Mean test loss of 258 batches: 0.7777360349662544.
[ Wed Sep 28 05:57:10 2022 ] 	Top1: 77.17%
[ Wed Sep 28 05:57:10 2022 ] 	Top5: 95.80%
[ Wed Sep 28 05:57:10 2022 ] Training epoch: 62
[ Wed Sep 28 06:00:17 2022 ] 	Mean training loss: 0.5335. loss2: 0.0000. Mean training acc: 83.04%.
[ Wed Sep 28 06:00:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:00:17 2022 ] Eval epoch: 62
[ Wed Sep 28 06:00:47 2022 ] 	Mean test loss of 258 batches: 0.882582028244817.
[ Wed Sep 28 06:00:47 2022 ] 	Top1: 74.63%
[ Wed Sep 28 06:00:47 2022 ] 	Top5: 94.76%
[ Wed Sep 28 06:00:47 2022 ] Training epoch: 63
[ Wed Sep 28 06:03:54 2022 ] 	Mean training loss: 0.5324. loss2: 0.0000. Mean training acc: 83.22%.
[ Wed Sep 28 06:03:54 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:03:54 2022 ] Eval epoch: 63
[ Wed Sep 28 06:04:23 2022 ] 	Mean test loss of 258 batches: 0.8024756760560265.
[ Wed Sep 28 06:04:23 2022 ] 	Top1: 76.96%
[ Wed Sep 28 06:04:23 2022 ] 	Top5: 95.37%
[ Wed Sep 28 06:04:24 2022 ] Training epoch: 64
[ Wed Sep 28 06:07:31 2022 ] 	Mean training loss: 0.5280. loss2: 0.0000. Mean training acc: 83.55%.
[ Wed Sep 28 06:07:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:07:31 2022 ] Eval epoch: 64
[ Wed Sep 28 06:08:00 2022 ] 	Mean test loss of 258 batches: 0.7333602850404821.
[ Wed Sep 28 06:08:00 2022 ] 	Top1: 78.83%
[ Wed Sep 28 06:08:00 2022 ] 	Top5: 95.84%
[ Wed Sep 28 06:08:00 2022 ] Training epoch: 65
[ Wed Sep 28 06:11:07 2022 ] 	Mean training loss: 0.5265. loss2: 0.0000. Mean training acc: 83.45%.
[ Wed Sep 28 06:11:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:11:07 2022 ] Eval epoch: 65
[ Wed Sep 28 06:11:37 2022 ] 	Mean test loss of 258 batches: 0.71662236212991.
[ Wed Sep 28 06:11:37 2022 ] 	Top1: 78.71%
[ Wed Sep 28 06:11:37 2022 ] 	Top5: 95.94%
[ Wed Sep 28 06:11:37 2022 ] Training epoch: 66
[ Wed Sep 28 06:14:44 2022 ] 	Mean training loss: 0.5339. loss2: 0.0000. Mean training acc: 83.36%.
[ Wed Sep 28 06:14:44 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:14:44 2022 ] Eval epoch: 66
[ Wed Sep 28 06:15:14 2022 ] 	Mean test loss of 258 batches: 0.7096764609910721.
[ Wed Sep 28 06:15:14 2022 ] 	Top1: 79.28%
[ Wed Sep 28 06:15:14 2022 ] 	Top5: 95.63%
[ Wed Sep 28 06:15:14 2022 ] Training epoch: 67
[ Wed Sep 28 06:18:22 2022 ] 	Mean training loss: 0.5256. loss2: 0.0000. Mean training acc: 83.44%.
[ Wed Sep 28 06:18:22 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:18:22 2022 ] Eval epoch: 67
[ Wed Sep 28 06:18:52 2022 ] 	Mean test loss of 258 batches: 0.7751910195563191.
[ Wed Sep 28 06:18:52 2022 ] 	Top1: 77.59%
[ Wed Sep 28 06:18:52 2022 ] 	Top5: 95.43%
[ Wed Sep 28 06:18:52 2022 ] Training epoch: 68
[ Wed Sep 28 06:21:59 2022 ] 	Mean training loss: 0.5260. loss2: 0.0000. Mean training acc: 83.37%.
[ Wed Sep 28 06:21:59 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:21:59 2022 ] Eval epoch: 68
[ Wed Sep 28 06:22:29 2022 ] 	Mean test loss of 258 batches: 0.7412932062333868.
[ Wed Sep 28 06:22:29 2022 ] 	Top1: 78.47%
[ Wed Sep 28 06:22:29 2022 ] 	Top5: 95.51%
[ Wed Sep 28 06:22:29 2022 ] Training epoch: 69
[ Wed Sep 28 06:25:38 2022 ] 	Mean training loss: 0.5338. loss2: 0.0000. Mean training acc: 83.08%.
[ Wed Sep 28 06:25:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:25:38 2022 ] Eval epoch: 69
[ Wed Sep 28 06:26:07 2022 ] 	Mean test loss of 258 batches: 0.697619728224222.
[ Wed Sep 28 06:26:07 2022 ] 	Top1: 78.88%
[ Wed Sep 28 06:26:07 2022 ] 	Top5: 96.02%
[ Wed Sep 28 06:26:07 2022 ] Training epoch: 70
[ Wed Sep 28 06:29:15 2022 ] 	Mean training loss: 0.5257. loss2: 0.0000. Mean training acc: 83.40%.
[ Wed Sep 28 06:29:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:29:15 2022 ] Eval epoch: 70
[ Wed Sep 28 06:29:45 2022 ] 	Mean test loss of 258 batches: 0.7904588462308396.
[ Wed Sep 28 06:29:45 2022 ] 	Top1: 77.06%
[ Wed Sep 28 06:29:45 2022 ] 	Top5: 95.57%
[ Wed Sep 28 06:29:45 2022 ] Training epoch: 71
[ Wed Sep 28 06:32:52 2022 ] 	Mean training loss: 0.5262. loss2: 0.0000. Mean training acc: 83.30%.
[ Wed Sep 28 06:32:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:32:53 2022 ] Eval epoch: 71
[ Wed Sep 28 06:33:22 2022 ] 	Mean test loss of 258 batches: 0.8763360188219899.
[ Wed Sep 28 06:33:22 2022 ] 	Top1: 75.54%
[ Wed Sep 28 06:33:22 2022 ] 	Top5: 94.72%
[ Wed Sep 28 06:33:22 2022 ] Training epoch: 72
[ Wed Sep 28 06:36:30 2022 ] 	Mean training loss: 0.5294. loss2: 0.0000. Mean training acc: 83.07%.
[ Wed Sep 28 06:36:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:36:30 2022 ] Eval epoch: 72
[ Wed Sep 28 06:36:59 2022 ] 	Mean test loss of 258 batches: 0.6426937485388083.
[ Wed Sep 28 06:36:59 2022 ] 	Top1: 81.02%
[ Wed Sep 28 06:36:59 2022 ] 	Top5: 96.42%
[ Wed Sep 28 06:36:59 2022 ] Training epoch: 73
[ Wed Sep 28 06:40:07 2022 ] 	Mean training loss: 0.5333. loss2: 0.0000. Mean training acc: 83.18%.
[ Wed Sep 28 06:40:07 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:40:07 2022 ] Eval epoch: 73
[ Wed Sep 28 06:40:37 2022 ] 	Mean test loss of 258 batches: 0.6733865677848343.
[ Wed Sep 28 06:40:37 2022 ] 	Top1: 80.01%
[ Wed Sep 28 06:40:37 2022 ] 	Top5: 95.98%
[ Wed Sep 28 06:40:37 2022 ] Training epoch: 74
[ Wed Sep 28 06:43:46 2022 ] 	Mean training loss: 0.5243. loss2: 0.0000. Mean training acc: 83.42%.
[ Wed Sep 28 06:43:46 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:43:46 2022 ] Eval epoch: 74
[ Wed Sep 28 06:44:16 2022 ] 	Mean test loss of 258 batches: 0.7023657976772434.
[ Wed Sep 28 06:44:16 2022 ] 	Top1: 78.46%
[ Wed Sep 28 06:44:16 2022 ] 	Top5: 96.47%
[ Wed Sep 28 06:44:16 2022 ] Training epoch: 75
[ Wed Sep 28 06:47:24 2022 ] 	Mean training loss: 0.5260. loss2: 0.0000. Mean training acc: 83.32%.
[ Wed Sep 28 06:47:24 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:47:24 2022 ] Eval epoch: 75
[ Wed Sep 28 06:47:54 2022 ] 	Mean test loss of 258 batches: 0.6930488611954128.
[ Wed Sep 28 06:47:54 2022 ] 	Top1: 79.58%
[ Wed Sep 28 06:47:54 2022 ] 	Top5: 95.76%
[ Wed Sep 28 06:47:54 2022 ] Training epoch: 76
[ Wed Sep 28 06:51:01 2022 ] 	Mean training loss: 0.5242. loss2: 0.0000. Mean training acc: 83.52%.
[ Wed Sep 28 06:51:01 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:51:01 2022 ] Eval epoch: 76
[ Wed Sep 28 06:51:30 2022 ] 	Mean test loss of 258 batches: 0.6748662932898647.
[ Wed Sep 28 06:51:30 2022 ] 	Top1: 79.98%
[ Wed Sep 28 06:51:31 2022 ] 	Top5: 96.35%
[ Wed Sep 28 06:51:31 2022 ] Training epoch: 77
[ Wed Sep 28 06:54:38 2022 ] 	Mean training loss: 0.5255. loss2: 0.0000. Mean training acc: 83.40%.
[ Wed Sep 28 06:54:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:54:38 2022 ] Eval epoch: 77
[ Wed Sep 28 06:55:08 2022 ] 	Mean test loss of 258 batches: 0.7474424537531165.
[ Wed Sep 28 06:55:08 2022 ] 	Top1: 78.27%
[ Wed Sep 28 06:55:08 2022 ] 	Top5: 95.99%
[ Wed Sep 28 06:55:08 2022 ] Training epoch: 78
[ Wed Sep 28 06:58:15 2022 ] 	Mean training loss: 0.5247. loss2: 0.0000. Mean training acc: 83.46%.
[ Wed Sep 28 06:58:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 06:58:15 2022 ] Eval epoch: 78
[ Wed Sep 28 06:58:45 2022 ] 	Mean test loss of 258 batches: 0.7060843285086543.
[ Wed Sep 28 06:58:45 2022 ] 	Top1: 78.21%
[ Wed Sep 28 06:58:45 2022 ] 	Top5: 96.38%
[ Wed Sep 28 06:58:45 2022 ] Training epoch: 79
[ Wed Sep 28 07:01:53 2022 ] 	Mean training loss: 0.5215. loss2: 0.0000. Mean training acc: 83.67%.
[ Wed Sep 28 07:01:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:01:53 2022 ] Eval epoch: 79
[ Wed Sep 28 07:02:23 2022 ] 	Mean test loss of 258 batches: 0.8273487374071002.
[ Wed Sep 28 07:02:23 2022 ] 	Top1: 77.28%
[ Wed Sep 28 07:02:23 2022 ] 	Top5: 94.73%
[ Wed Sep 28 07:02:23 2022 ] Training epoch: 80
[ Wed Sep 28 07:05:31 2022 ] 	Mean training loss: 0.5210. loss2: 0.0000. Mean training acc: 83.47%.
[ Wed Sep 28 07:05:31 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:05:31 2022 ] Eval epoch: 80
[ Wed Sep 28 07:06:00 2022 ] 	Mean test loss of 258 batches: 0.8040942811919737.
[ Wed Sep 28 07:06:00 2022 ] 	Top1: 76.94%
[ Wed Sep 28 07:06:00 2022 ] 	Top5: 95.06%
[ Wed Sep 28 07:06:00 2022 ] Training epoch: 81
[ Wed Sep 28 07:09:08 2022 ] 	Mean training loss: 0.5234. loss2: 0.0000. Mean training acc: 83.52%.
[ Wed Sep 28 07:09:08 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:09:08 2022 ] Eval epoch: 81
[ Wed Sep 28 07:09:37 2022 ] 	Mean test loss of 258 batches: 0.6666165177327718.
[ Wed Sep 28 07:09:37 2022 ] 	Top1: 79.93%
[ Wed Sep 28 07:09:37 2022 ] 	Top5: 96.60%
[ Wed Sep 28 07:09:38 2022 ] Training epoch: 82
[ Wed Sep 28 07:12:45 2022 ] 	Mean training loss: 0.5204. loss2: 0.0000. Mean training acc: 83.47%.
[ Wed Sep 28 07:12:45 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:12:45 2022 ] Eval epoch: 82
[ Wed Sep 28 07:13:14 2022 ] 	Mean test loss of 258 batches: 0.7650209455065025.
[ Wed Sep 28 07:13:14 2022 ] 	Top1: 77.06%
[ Wed Sep 28 07:13:14 2022 ] 	Top5: 96.05%
[ Wed Sep 28 07:13:14 2022 ] Training epoch: 83
[ Wed Sep 28 07:16:21 2022 ] 	Mean training loss: 0.5202. loss2: 0.0000. Mean training acc: 83.64%.
[ Wed Sep 28 07:16:21 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:16:21 2022 ] Eval epoch: 83
[ Wed Sep 28 07:16:51 2022 ] 	Mean test loss of 258 batches: 0.6794873198797536.
[ Wed Sep 28 07:16:51 2022 ] 	Top1: 79.99%
[ Wed Sep 28 07:16:51 2022 ] 	Top5: 96.22%
[ Wed Sep 28 07:16:51 2022 ] Training epoch: 84
[ Wed Sep 28 07:19:58 2022 ] 	Mean training loss: 0.5251. loss2: 0.0000. Mean training acc: 83.55%.
[ Wed Sep 28 07:19:58 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:19:58 2022 ] Eval epoch: 84
[ Wed Sep 28 07:20:27 2022 ] 	Mean test loss of 258 batches: 0.771519561955171.
[ Wed Sep 28 07:20:27 2022 ] 	Top1: 77.26%
[ Wed Sep 28 07:20:27 2022 ] 	Top5: 95.86%
[ Wed Sep 28 07:20:28 2022 ] Training epoch: 85
[ Wed Sep 28 07:23:35 2022 ] 	Mean training loss: 0.5228. loss2: 0.0000. Mean training acc: 83.60%.
[ Wed Sep 28 07:23:35 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:23:35 2022 ] Eval epoch: 85
[ Wed Sep 28 07:24:04 2022 ] 	Mean test loss of 258 batches: 0.6674356834255448.
[ Wed Sep 28 07:24:04 2022 ] 	Top1: 79.70%
[ Wed Sep 28 07:24:04 2022 ] 	Top5: 96.88%
[ Wed Sep 28 07:24:04 2022 ] Training epoch: 86
[ Wed Sep 28 07:27:12 2022 ] 	Mean training loss: 0.5195. loss2: 0.0000. Mean training acc: 83.41%.
[ Wed Sep 28 07:27:12 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:27:12 2022 ] Eval epoch: 86
[ Wed Sep 28 07:27:41 2022 ] 	Mean test loss of 258 batches: 0.6392164481587188.
[ Wed Sep 28 07:27:41 2022 ] 	Top1: 80.57%
[ Wed Sep 28 07:27:41 2022 ] 	Top5: 96.52%
[ Wed Sep 28 07:27:41 2022 ] Training epoch: 87
[ Wed Sep 28 07:30:48 2022 ] 	Mean training loss: 0.5167. loss2: 0.0000. Mean training acc: 83.76%.
[ Wed Sep 28 07:30:48 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:30:48 2022 ] Eval epoch: 87
[ Wed Sep 28 07:31:18 2022 ] 	Mean test loss of 258 batches: 0.6424160167690396.
[ Wed Sep 28 07:31:18 2022 ] 	Top1: 80.68%
[ Wed Sep 28 07:31:18 2022 ] 	Top5: 96.25%
[ Wed Sep 28 07:31:18 2022 ] Training epoch: 88
[ Wed Sep 28 07:34:25 2022 ] 	Mean training loss: 0.5271. loss2: 0.0000. Mean training acc: 83.44%.
[ Wed Sep 28 07:34:25 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:34:25 2022 ] Eval epoch: 88
[ Wed Sep 28 07:34:54 2022 ] 	Mean test loss of 258 batches: 0.8091741245846416.
[ Wed Sep 28 07:34:54 2022 ] 	Top1: 77.17%
[ Wed Sep 28 07:34:54 2022 ] 	Top5: 94.82%
[ Wed Sep 28 07:34:54 2022 ] Training epoch: 89
[ Wed Sep 28 07:38:02 2022 ] 	Mean training loss: 0.5247. loss2: 0.0000. Mean training acc: 83.53%.
[ Wed Sep 28 07:38:02 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:38:02 2022 ] Eval epoch: 89
[ Wed Sep 28 07:38:31 2022 ] 	Mean test loss of 258 batches: 0.6897987645379332.
[ Wed Sep 28 07:38:31 2022 ] 	Top1: 78.46%
[ Wed Sep 28 07:38:31 2022 ] 	Top5: 96.27%
[ Wed Sep 28 07:38:31 2022 ] Training epoch: 90
[ Wed Sep 28 07:41:38 2022 ] 	Mean training loss: 0.5267. loss2: 0.0000. Mean training acc: 83.41%.
[ Wed Sep 28 07:41:38 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:41:39 2022 ] Eval epoch: 90
[ Wed Sep 28 07:42:08 2022 ] 	Mean test loss of 258 batches: 0.794819851138795.
[ Wed Sep 28 07:42:08 2022 ] 	Top1: 76.31%
[ Wed Sep 28 07:42:08 2022 ] 	Top5: 95.92%
[ Wed Sep 28 07:42:08 2022 ] Training epoch: 91
[ Wed Sep 28 07:45:15 2022 ] 	Mean training loss: 0.3215. loss2: 0.0000. Mean training acc: 90.06%.
[ Wed Sep 28 07:45:15 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:45:15 2022 ] Eval epoch: 91
[ Wed Sep 28 07:45:44 2022 ] 	Mean test loss of 258 batches: 0.42318169183509297.
[ Wed Sep 28 07:45:45 2022 ] 	Top1: 87.25%
[ Wed Sep 28 07:45:45 2022 ] 	Top5: 97.99%
[ Wed Sep 28 07:45:45 2022 ] Training epoch: 92
[ Wed Sep 28 07:48:52 2022 ] 	Mean training loss: 0.2559. loss2: 0.0000. Mean training acc: 92.09%.
[ Wed Sep 28 07:48:52 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:48:52 2022 ] Eval epoch: 92
[ Wed Sep 28 07:49:22 2022 ] 	Mean test loss of 258 batches: 0.4270911603415197.
[ Wed Sep 28 07:49:22 2022 ] 	Top1: 87.27%
[ Wed Sep 28 07:49:22 2022 ] 	Top5: 97.94%
[ Wed Sep 28 07:49:22 2022 ] Training epoch: 93
[ Wed Sep 28 07:52:29 2022 ] 	Mean training loss: 0.2308. loss2: 0.0000. Mean training acc: 92.96%.
[ Wed Sep 28 07:52:29 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:52:29 2022 ] Eval epoch: 93
[ Wed Sep 28 07:52:59 2022 ] 	Mean test loss of 258 batches: 0.4170410423142503.
[ Wed Sep 28 07:52:59 2022 ] 	Top1: 87.65%
[ Wed Sep 28 07:52:59 2022 ] 	Top5: 98.00%
[ Wed Sep 28 07:52:59 2022 ] Training epoch: 94
[ Wed Sep 28 07:56:06 2022 ] 	Mean training loss: 0.2107. loss2: 0.0000. Mean training acc: 93.65%.
[ Wed Sep 28 07:56:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:56:06 2022 ] Eval epoch: 94
[ Wed Sep 28 07:56:35 2022 ] 	Mean test loss of 258 batches: 0.41240744692699394.
[ Wed Sep 28 07:56:36 2022 ] 	Top1: 87.80%
[ Wed Sep 28 07:56:36 2022 ] 	Top5: 98.06%
[ Wed Sep 28 07:56:36 2022 ] Training epoch: 95
[ Wed Sep 28 07:59:43 2022 ] 	Mean training loss: 0.1975. loss2: 0.0000. Mean training acc: 94.00%.
[ Wed Sep 28 07:59:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 07:59:43 2022 ] Eval epoch: 95
[ Wed Sep 28 08:00:12 2022 ] 	Mean test loss of 258 batches: 0.418319833197797.
[ Wed Sep 28 08:00:12 2022 ] 	Top1: 87.84%
[ Wed Sep 28 08:00:13 2022 ] 	Top5: 97.92%
[ Wed Sep 28 08:00:13 2022 ] Training epoch: 96
[ Wed Sep 28 08:03:20 2022 ] 	Mean training loss: 0.1872. loss2: 0.0000. Mean training acc: 94.39%.
[ Wed Sep 28 08:03:20 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:03:20 2022 ] Eval epoch: 96
[ Wed Sep 28 08:03:49 2022 ] 	Mean test loss of 258 batches: 0.4314771458622097.
[ Wed Sep 28 08:03:49 2022 ] 	Top1: 87.29%
[ Wed Sep 28 08:03:49 2022 ] 	Top5: 97.94%
[ Wed Sep 28 08:03:49 2022 ] Training epoch: 97
[ Wed Sep 28 08:06:56 2022 ] 	Mean training loss: 0.1756. loss2: 0.0000. Mean training acc: 94.85%.
[ Wed Sep 28 08:06:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:06:56 2022 ] Eval epoch: 97
[ Wed Sep 28 08:07:26 2022 ] 	Mean test loss of 258 batches: 0.4203252374201782.
[ Wed Sep 28 08:07:26 2022 ] 	Top1: 87.81%
[ Wed Sep 28 08:07:26 2022 ] 	Top5: 98.07%
[ Wed Sep 28 08:07:26 2022 ] Training epoch: 98
[ Wed Sep 28 08:10:33 2022 ] 	Mean training loss: 0.1676. loss2: 0.0000. Mean training acc: 95.06%.
[ Wed Sep 28 08:10:33 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:10:33 2022 ] Eval epoch: 98
[ Wed Sep 28 08:11:02 2022 ] 	Mean test loss of 258 batches: 0.4262409140319787.
[ Wed Sep 28 08:11:02 2022 ] 	Top1: 87.51%
[ Wed Sep 28 08:11:02 2022 ] 	Top5: 97.95%
[ Wed Sep 28 08:11:02 2022 ] Training epoch: 99
[ Wed Sep 28 08:14:10 2022 ] 	Mean training loss: 0.1595. loss2: 0.0000. Mean training acc: 95.32%.
[ Wed Sep 28 08:14:10 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:14:10 2022 ] Eval epoch: 99
[ Wed Sep 28 08:14:40 2022 ] 	Mean test loss of 258 batches: 0.44353242276250854.
[ Wed Sep 28 08:14:40 2022 ] 	Top1: 87.15%
[ Wed Sep 28 08:14:40 2022 ] 	Top5: 97.96%
[ Wed Sep 28 08:14:40 2022 ] Training epoch: 100
[ Wed Sep 28 08:17:49 2022 ] 	Mean training loss: 0.1497. loss2: 0.0000. Mean training acc: 95.64%.
[ Wed Sep 28 08:17:49 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:17:49 2022 ] Eval epoch: 100
[ Wed Sep 28 08:18:18 2022 ] 	Mean test loss of 258 batches: 0.4539213231945223.
[ Wed Sep 28 08:18:18 2022 ] 	Top1: 86.95%
[ Wed Sep 28 08:18:18 2022 ] 	Top5: 97.88%
[ Wed Sep 28 08:18:18 2022 ] Training epoch: 101
[ Wed Sep 28 08:21:26 2022 ] 	Mean training loss: 0.1257. loss2: 0.0000. Mean training acc: 96.48%.
[ Wed Sep 28 08:21:26 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:21:26 2022 ] Eval epoch: 101
[ Wed Sep 28 08:21:55 2022 ] 	Mean test loss of 258 batches: 0.41718029724650607.
[ Wed Sep 28 08:21:55 2022 ] 	Top1: 88.05%
[ Wed Sep 28 08:21:55 2022 ] 	Top5: 98.05%
[ Wed Sep 28 08:21:55 2022 ] Training epoch: 102
[ Wed Sep 28 08:25:03 2022 ] 	Mean training loss: 0.1157. loss2: 0.0000. Mean training acc: 96.86%.
[ Wed Sep 28 08:25:03 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:25:03 2022 ] Eval epoch: 102
[ Wed Sep 28 08:25:32 2022 ] 	Mean test loss of 258 batches: 0.4147902494079845.
[ Wed Sep 28 08:25:32 2022 ] 	Top1: 87.91%
[ Wed Sep 28 08:25:32 2022 ] 	Top5: 98.05%
[ Wed Sep 28 08:25:32 2022 ] Training epoch: 103
[ Wed Sep 28 08:28:40 2022 ] 	Mean training loss: 0.1060. loss2: 0.0000. Mean training acc: 97.22%.
[ Wed Sep 28 08:28:40 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:28:40 2022 ] Eval epoch: 103
[ Wed Sep 28 08:29:09 2022 ] 	Mean test loss of 258 batches: 0.4153627481658098.
[ Wed Sep 28 08:29:09 2022 ] 	Top1: 87.95%
[ Wed Sep 28 08:29:09 2022 ] 	Top5: 98.03%
[ Wed Sep 28 08:29:09 2022 ] Training epoch: 104
[ Wed Sep 28 08:32:17 2022 ] 	Mean training loss: 0.1065. loss2: 0.0000. Mean training acc: 97.14%.
[ Wed Sep 28 08:32:17 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:32:17 2022 ] Eval epoch: 104
[ Wed Sep 28 08:32:46 2022 ] 	Mean test loss of 258 batches: 0.4142666650546152.
[ Wed Sep 28 08:32:46 2022 ] 	Top1: 88.17%
[ Wed Sep 28 08:32:46 2022 ] 	Top5: 98.05%
[ Wed Sep 28 08:32:46 2022 ] Training epoch: 105
[ Wed Sep 28 08:35:53 2022 ] 	Mean training loss: 0.1036. loss2: 0.0000. Mean training acc: 97.24%.
[ Wed Sep 28 08:35:53 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:35:53 2022 ] Eval epoch: 105
[ Wed Sep 28 08:36:23 2022 ] 	Mean test loss of 258 batches: 0.4187917561927276.
[ Wed Sep 28 08:36:23 2022 ] 	Top1: 87.96%
[ Wed Sep 28 08:36:23 2022 ] 	Top5: 97.97%
[ Wed Sep 28 08:36:23 2022 ] Training epoch: 106
[ Wed Sep 28 08:39:30 2022 ] 	Mean training loss: 0.0993. loss2: 0.0000. Mean training acc: 97.46%.
[ Wed Sep 28 08:39:30 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:39:30 2022 ] Eval epoch: 106
[ Wed Sep 28 08:39:59 2022 ] 	Mean test loss of 258 batches: 0.4274025326089341.
[ Wed Sep 28 08:39:59 2022 ] 	Top1: 87.68%
[ Wed Sep 28 08:39:59 2022 ] 	Top5: 98.03%
[ Wed Sep 28 08:39:59 2022 ] Training epoch: 107
[ Wed Sep 28 08:43:06 2022 ] 	Mean training loss: 0.0989. loss2: 0.0000. Mean training acc: 97.44%.
[ Wed Sep 28 08:43:06 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:43:06 2022 ] Eval epoch: 107
[ Wed Sep 28 08:43:36 2022 ] 	Mean test loss of 258 batches: 0.4230449684253035.
[ Wed Sep 28 08:43:36 2022 ] 	Top1: 87.96%
[ Wed Sep 28 08:43:36 2022 ] 	Top5: 98.06%
[ Wed Sep 28 08:43:36 2022 ] Training epoch: 108
[ Wed Sep 28 08:46:43 2022 ] 	Mean training loss: 0.0956. loss2: 0.0000. Mean training acc: 97.60%.
[ Wed Sep 28 08:46:43 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:46:43 2022 ] Eval epoch: 108
[ Wed Sep 28 08:47:12 2022 ] 	Mean test loss of 258 batches: 0.42047562237915603.
[ Wed Sep 28 08:47:12 2022 ] 	Top1: 88.00%
[ Wed Sep 28 08:47:12 2022 ] 	Top5: 98.07%
[ Wed Sep 28 08:47:12 2022 ] Training epoch: 109
[ Wed Sep 28 08:50:19 2022 ] 	Mean training loss: 0.0945. loss2: 0.0000. Mean training acc: 97.57%.
[ Wed Sep 28 08:50:19 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:50:19 2022 ] Eval epoch: 109
[ Wed Sep 28 08:50:49 2022 ] 	Mean test loss of 258 batches: 0.43629898475988427.
[ Wed Sep 28 08:50:49 2022 ] 	Top1: 87.54%
[ Wed Sep 28 08:50:49 2022 ] 	Top5: 98.00%
[ Wed Sep 28 08:50:49 2022 ] Training epoch: 110
[ Wed Sep 28 08:53:56 2022 ] 	Mean training loss: 0.0890. loss2: 0.0000. Mean training acc: 97.77%.
[ Wed Sep 28 08:53:56 2022 ] 	Time consumption: [Data]02%, [Network]98%
[ Wed Sep 28 08:53:56 2022 ] Eval epoch: 110
[ Wed Sep 28 08:54:25 2022 ] 	Mean test loss of 258 batches: 0.4239589345114407.
[ Wed Sep 28 08:54:25 2022 ] 	Top1: 88.03%
[ Wed Sep 28 08:54:26 2022 ] 	Top5: 98.02%
[ Wed Sep 28 08:54:55 2022 ] Best accuracy: 0.8816643416024746
[ Wed Sep 28 08:54:55 2022 ] Epoch number: 104
[ Wed Sep 28 08:54:55 2022 ] Model name: work_dir/ntu60/csub/fc_vel
[ Wed Sep 28 08:54:55 2022 ] Model total number of params: 2082097
[ Wed Sep 28 08:54:55 2022 ] Weight decay: 0.0004
[ Wed Sep 28 08:54:55 2022 ] Base LR: 0.1
[ Wed Sep 28 08:54:55 2022 ] Batch Size: 64
[ Wed Sep 28 08:54:55 2022 ] Test Batch Size: 64
[ Wed Sep 28 08:54:55 2022 ] seed: 1
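
The evaluation lines above follow a fixed pattern, so the accuracy curve and the best epoch (reported as "Best accuracy" / "Epoch number" in the summary) can be recovered from the raw log with a few regular expressions. A minimal sketch, assuming the log text is available as a string (the function name `parse_eval_log` is illustrative, not part of the training code):

```python
import re

def parse_eval_log(text):
    """Extract (epoch, top1, top5) triples from a log in this format.

    Relies on the fixed line patterns emitted by the trainer:
      'Eval epoch: N', 'Top1: X%', 'Top5: Y%'.
    """
    epochs = re.findall(r"Eval epoch: (\d+)", text)
    top1 = re.findall(r"Top1: ([\d.]+)%", text)
    top5 = re.findall(r"Top5: ([\d.]+)%", text)
    return [(int(e), float(a), float(b)) for e, a, b in zip(epochs, top1, top5)]

# Small excerpt in the same format, used as a self-contained example.
sample = (
    "[ Wed Sep 28 06:18:22 2022 ] Eval epoch: 67\n"
    "[ Wed Sep 28 06:18:52 2022 ] \tTop1: 77.59%\n"
    "[ Wed Sep 28 06:18:52 2022 ] \tTop5: 95.43%\n"
    "[ Wed Sep 28 08:32:46 2022 ] Eval epoch: 104\n"
    "[ Wed Sep 28 08:32:46 2022 ] \tTop1: 88.17%\n"
    "[ Wed Sep 28 08:32:46 2022 ] \tTop5: 98.05%\n"
)
results = parse_eval_log(sample)
best_epoch, best_top1, _ = max(results, key=lambda r: r[1])
```

Applied to the full log, the maximum Top-1 lands on epoch 104 at 88.17%, consistent with the "Best accuracy: 0.8816…" summary line (the summary stores the unrounded fraction, while the per-epoch lines are rounded to two decimals).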
